Non-Convex Projected Gradient Descent for Generalized Low-Rank Tensor Regression
Authors
Abstract
In this paper, we consider the problem of learning high-dimensional tensor regression models with low-rank structure. One of the core challenges in learning high-dimensional models is computational, since the underlying optimization problems are often non-convex. While convex relaxations can lead to polynomial-time algorithms, they are often slow in practice. On the other hand, only limited theoretical guarantees exist for non-convex methods. In this paper we provide a general framework with theoretical guarantees for learning high-dimensional tensor regression models under different low-rank structural assumptions, using the projected gradient descent algorithm applied to a potentially non-convex constraint set Θ characterized in terms of its localized Gaussian width. We juxtapose our theoretical results for non-convex projected gradient descent with previous results on regularized convex approaches. The two main differences between the convex and non-convex approaches are: (i) from a computational perspective, whether the non-convex projection operator is computable and whether the projection has desirable contraction properties, and (ii) from a statistical upper-bound perspective, the non-convex approach achieves a superior rate in a number of examples. We provide three concrete examples of low-dimensional structure that address these issues and explain the pros and cons of the non-convex and convex approaches. We supplement our theoretical results with simulations showing that, under several common settings of generalized low-rank tensor regression, the projected gradient descent approach is superior in both statistical error and run-time, provided the step sizes of the projected descent algorithm are suitably chosen.
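To make the scheme concrete, the following is a minimal sketch of projected gradient descent with a hard rank constraint, specialized to the matrix (order-2 tensor) case: a gradient step on the least-squares loss, followed by projection onto the rank-r set via truncated SVD. Function names, step size, and iteration counts are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def project_low_rank(Theta, r):
    """Project onto the set of rank-<=r matrices by truncating the SVD."""
    U, s, Vt = np.linalg.svd(Theta, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def projected_gradient_descent(X, y, shape, r, step=0.1, n_iters=500):
    """Minimize ||y - X vec(Theta)||^2 / (2n) subject to rank(Theta) <= r:
    take a gradient step, then project back onto the non-convex rank set."""
    n = X.shape[0]
    Theta = np.zeros(shape)
    for _ in range(n_iters):
        residual = X @ Theta.ravel() - y
        grad = (X.T @ residual).reshape(shape) / n
        Theta = project_low_rank(Theta - step * grad, r)
    return Theta
```

On a well-conditioned noiseless instance, the iterates contract toward the low-rank target, which is the contraction property the abstract refers to.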
Similar Resources
Fast low-rank estimation by projected gradient descent: General statistical and algorithmic guarantees
Optimization problems with rank constraints arise in many applications, including matrix regression, structured PCA, matrix completion and matrix decomposition problems. An attractive heuristic for solving such problems is to factorize the low-rank matrix, and to run projected gradient descent on the nonconvex factorized optimization problem. The goal of this problem is to provide a general the...
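The factorization heuristic this snippet describes can be sketched as follows: parametrize Theta = U Vᵀ so the rank constraint is enforced by construction, and run plain gradient descent on the factors. This is a hedged illustration for a least-squares loss with a small random initialization; the choices of step size and initialization scale are assumptions, not the referenced paper's prescription.

```python
import numpy as np

def factored_gradient_descent(X, y, d1, d2, r, step=0.01, n_iters=3000, seed=0):
    """Gradient descent on the factors (U, V) of Theta = U V^T for the loss
    ||y - X vec(U V^T)||^2 / (2n); the rank-r constraint is implicit in the
    parametrization, so no projection step is needed."""
    rng = np.random.default_rng(seed)
    U = 0.1 * rng.standard_normal((d1, r))
    V = 0.1 * rng.standard_normal((d2, r))
    n = X.shape[0]
    for _ in range(n_iters):
        Theta = U @ V.T
        grad = ((X.T @ (X @ Theta.ravel() - y)) / n).reshape(d1, d2)
        # Update both factors from the same (old) iterates.
        U, V = U - step * grad @ V, V - step * grad.T @ U
    return U @ V.T
```

Compared with the hard-projection variant, each iteration avoids an SVD, at the cost of a non-convex landscape in (U, V).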
Spectral Compressed Sensing via Projected Gradient Descent
Let x ∈ C be a spectrally sparse signal consisting of r complex sinusoids with or without damping. We consider the spectral compressed sensing problem, which is about reconstructing x from its partially revealed entries. By utilizing the low-rank structure of the Hankel matrix corresponding to x, we develop a computationally efficient algorithm for this problem. The algorithm starts from an initi...
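The low-rank Hankel structure mentioned in this snippet is easy to verify directly: for a signal that is a sum of r complex sinusoids, the Hankel matrix built from it has rank exactly r. A minimal sketch (the helper name is ours):

```python
import numpy as np

def hankel_matrix(x, n_rows):
    """Arrange a signal x into a Hankel matrix H with H[i, j] = x[i + j]."""
    n_cols = len(x) - n_rows + 1
    return np.array([x[i:i + n_cols] for i in range(n_rows)])
```

Because each sinusoid contributes a rank-one outer product of Vandermonde vectors to H, recovering x reduces to a low-rank Hankel matrix completion problem.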
Image models for segmentation and recognition
We examine image models for segmentation and classification that are based (i) on the statistical properties of natural images and (ii) on non-negative matrix or tensor codes. Regarding (i) we derive a parametric framework for variational image segmentation. Using a model for the filter response statistics of natural images we build a sound probabilistic distance measure that drives level sets ...
Provable non-convex projected gradient descent for a class of constrained matrix optimization problems
We consider the minimization of convex loss functions over the set of positive semi-definite (PSD) matrices that are also norm-constrained. Such criteria appear in quantum state tomography and phase retrieval applications, among others. However, without careful design, existing methods quickly run into time and memory bottlenecks as problem dimensions increase. However, the ma...
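For the constraint set in this snippet, the projection onto the intersection of the PSD cone and a Frobenius-norm ball has a closed form: clip negative eigenvalues to zero, then radially rescale into the ball. A sketch under that setup (function and parameter names are ours):

```python
import numpy as np

def project_psd(A, fro_radius=None):
    """Project a (symmetrized) matrix onto the PSD cone by zeroing negative
    eigenvalues; if fro_radius is given, rescale into the Frobenius ball.
    Clipping then uniform scaling is the exact projection onto the
    intersection, since both steps act on the same eigenbasis."""
    w, Q = np.linalg.eigh((A + A.T) / 2)
    P = (Q * np.clip(w, 0, None)) @ Q.T
    if fro_radius is not None:
        nrm = np.linalg.norm(P)
        if nrm > fro_radius:
            P *= fro_radius / nrm
    return P
```

This makes projected gradient descent directly applicable to norm-constrained PSD problems at the cost of one eigendecomposition per iteration.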
A Universal Variance Reduction-Based Catalyst for Nonconvex Low-Rank Matrix Recovery
We propose a generic framework based on a new stochastic variance-reduced gradient descent algorithm for accelerating nonconvex low-rank matrix recovery. Starting from an appropriate initial estimator, our proposed algorithm performs projected gradient descent based on a novel semi-stochastic gradient specifically designed for low-rank matrix recovery. Based upon the mild restricted strong conv...
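The semi-stochastic gradient this snippet refers to can be sketched for least squares: per inner step, combine the stochastic gradient difference between the current point and an epoch anchor with the anchor's full gradient, then project back to low rank. All parameter choices below are illustrative assumptions, not the referenced algorithm's exact settings.

```python
import numpy as np

def svrg_low_rank(X, y, shape, r, step=1e-3, n_epochs=10, seed=0):
    """SVRG-style projected updates for least-squares matrix recovery with a
    rank-r constraint enforced by truncated SVD after every inner step."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    Theta = np.zeros(shape)
    for _ in range(n_epochs):
        # Anchor point: the full gradient is recomputed once per epoch.
        anchor = Theta.copy()
        full_grad = (X.T @ (X @ anchor.ravel() - y)).reshape(shape) / n
        for _ in range(n):
            i = rng.integers(n)
            xi = X[i]
            # Semi-stochastic gradient: for least squares, the stochastic
            # difference reduces to x_i x_i^T (Theta - anchor), so its
            # variance shrinks as Theta approaches the anchor.
            g = (xi * (xi @ (Theta - anchor).ravel())).reshape(shape) + full_grad
            U, s, Vt = np.linalg.svd(Theta - step * g, full_matrices=False)
            Theta = (U[:, :r] * s[:r]) @ Vt[:r]
    return Theta
```

The variance reduction lets the method use per-sample gradients (cheap inner steps) while retaining full-gradient-like progress per epoch.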